Motivated by recent progress in using restricted Boltzmann machines as preprocessing algorithms for deep neural networks, we revisit the mean-field equations (belief-propagation and TAP equations) in the best understood such machine, namely the Hopfield model of neural networks, and we show explicitly how they can be used as iterative message-passing algorithms, providing a fast method to compute the local polarizations of neurons. In the "retrieval phase", where neurons polarize in the direction of one memorized pattern, we point out a major difference between the belief-propagation and TAP equations: the set of belief-propagation equations depends on the pattern which is retrieved, while one can use a single set of TAP equations. This makes the latter method much better suited for applications in the learning process of restricted Boltzmann machines. In the case where the patterns memorized in the Hopfield model are not independent, but are correlated through a combinatorial structure, we show that the TAP equations have to be modified. This modification can be seen either as an alteration of the reaction term in the TAP equations or, more interestingly, as the consequence of message passing on a graphical model with several hidden layers, where the number of hidden layers depends on the depth of the correlations in the memorized patterns. This layered structure is actually necessary when one deals with more general restricted Boltzmann machines.
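As a concrete illustration of the kind of iteration the abstract refers to, the following sketch runs a damped fixed-point iteration of the generic TAP equations, m_i <- tanh(beta * sum_j J_ij m_j - beta^2 m_i sum_j J_ij^2 (1 - m_j^2)), on a Hopfield network with Hebbian couplings J_ij = (1/N) sum_mu xi_i^mu xi_j^mu. This is a minimal sketch only: the Onsager reaction term written here is the standard textbook form, and the parameter choices (beta, alpha = P/N, the damping factor) are assumptions made for illustration; the paper's own iteration scheme and its modified reaction term for correlated patterns are not reproduced.

# Minimal TAP iteration sketch for the Hopfield model (illustrative assumptions,
# not the paper's exact scheme).
import numpy as np

def hopfield_couplings(xi):
    """Hebbian couplings J = xi^T xi / N with zero diagonal; xi has shape (P, N)."""
    P, N = xi.shape
    J = xi.T @ xi / N
    np.fill_diagonal(J, 0.0)
    return J

def tap_iterate(J, beta, m0, n_steps=200, damping=0.5, tol=1e-8):
    """Damped fixed-point iteration of the generic TAP equations."""
    m = m0.copy()
    J2 = J ** 2
    for _ in range(n_steps):
        # Onsager reaction term: beta^2 * m_i * sum_j J_ij^2 (1 - m_j^2)
        onsager = beta ** 2 * m * (J2 @ (1.0 - m ** 2))
        m_new = np.tanh(beta * (J @ m) - onsager)
        m_next = damping * m + (1.0 - damping) * m_new
        if np.max(np.abs(m_next - m)) < tol:
            return m_next
        m = m_next
    return m

rng = np.random.default_rng(0)
N, P, beta = 1000, 20, 2.0            # alpha = P/N = 0.02, low-load regime
xi = rng.choice([-1.0, 1.0], size=(P, N))
J = hopfield_couplings(xi)

# Initialize near pattern 0 to probe the retrieval fixed point, then measure
# the overlap of the converged polarizations with that pattern.
m = tap_iterate(J, beta, m0=0.9 * xi[0])
print("overlap with pattern 0:", xi[0] @ m / N)

Note that, in line with the abstract's point, this single set of TAP equations is initialized near different patterns to reach different retrieval fixed points, whereas a belief-propagation implementation would require a pattern-dependent set of equations.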